Reviews: Combinatorial Inference against Label Noise

Neural Information Processing Systems

The combinatorial (meta- or super-class) idea is interesting: it is reasonable, and one would readily expect it to work well. In terms of related work, I suggest adding two related papers. One is ECOC (Solving Multiclass Learning Problems via Error-Correcting Output Codes, JAIR 1995), which is a classic combinatorial method for classification. The other is PENCIL (Probabilistic End-to-end Noise Correction for Learning with Noisy Labels, CVPR 2019), which is a novel noise-handling method. With regard to the method, the proposed probabilistic way of deciphering the base class from meta-classes is simple.
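To make the decoding step concrete, here is a minimal sketch of how base-class scores might be recovered from several meta-class classifiers by combining the meta-class probabilities that each base class inherits. The partitions, probabilities, and the independence assumption are illustrative only, not the paper's exact formulation.

import numpy as np

# Minimal sketch: recover base-class scores from several meta-class classifiers
# by multiplying (in log space) the probability of the meta-class each base
# class belongs to, treating the constituent classifiers as independent.
num_classes = 10
partitions = [                                    # base class -> meta-class index
    np.array([0, 0, 1, 1, 2, 2, 3, 3, 4, 4]),     # classifier 1: 5 meta-classes
    np.array([0, 1, 0, 1, 0, 1, 0, 1, 0, 1]),     # classifier 2: 2 meta-classes
]

def combine(meta_probs, partitions):
    """meta_probs[i] is the softmax output of classifier i over its meta-classes."""
    log_scores = np.zeros(num_classes)
    for probs, part in zip(meta_probs, partitions):
        # each base class inherits the probability of its meta-class
        log_scores += np.log(probs[part] + 1e-12)
    return np.argmax(log_scores), log_scores

# Example: meta-class posteriors from two constituent classifiers for one input
meta_probs = [
    np.array([0.05, 0.7, 0.1, 0.1, 0.05]),
    np.array([0.3, 0.7]),
]
pred, scores = combine(meta_probs, partitions)
print(pred)  # the base class favoured jointly by both meta-class classifiers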


Combinatorial Inference against Label Noise

Seo, Paul Hongsuck, Kim, Geeho, Han, Bohyung

Neural Information Processing Systems

Label noise is one of the critical factors that significantly degrade the generalization performance of deep neural networks. To handle the label noise issue in a principled way, we propose a unique classification framework that constructs multiple models in heterogeneous coarse-grained meta-class spaces and makes a joint inference over the trained models for the final predictions in the original (base) class space. Our approach reduces the noise level simply by constructing meta-classes and improves accuracy via combinatorial inference over multiple constituent classifiers. Since the proposed framework has distinct and complementary properties for the given problem, we can even incorporate additional off-the-shelf learning algorithms to improve accuracy further. We also introduce techniques to organize multiple heterogeneous meta-class sets using $k$-means clustering and to identify a desirable subset that leads to learning compact models.
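As a rough illustration of the meta-class construction described above, the sketch below clusters per-class feature prototypes with k-means to form several heterogeneous coarse partitions of the base classes. The feature source, the numbers of meta-classes, and the seeds are assumptions made for illustration, not the authors' exact procedure.

import numpy as np
from sklearn.cluster import KMeans

# Illustrative sketch: organize heterogeneous meta-class sets by clustering
# class prototypes (e.g., mean penultimate-layer features per class) with
# k-means, varying the number of clusters and the random seed.
rng = np.random.default_rng(0)
num_classes, feat_dim = 10, 64
class_prototypes = rng.normal(size=(num_classes, feat_dim))  # stand-in features

def build_meta_class_sets(prototypes, metas=(2, 3, 5), seeds=(0, 1)):
    """Return a list of base-class -> meta-class assignments, one per constituent model."""
    partitions = []
    for k in metas:
        for seed in seeds:
            km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(prototypes)
            partitions.append(km.labels_.copy())  # array of length num_classes
    return partitions

partitions = build_meta_class_sets(class_prototypes)
for p in partitions[:2]:
    print(p)  # each array relabels the base classes into a coarse meta-class space

Each partition would then define the label space of one constituent classifier, and their outputs could be combined as in the decoding sketch above.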


Combinatorial Inference for Graphical Models

Neykov, Matey, Lu, Junwei, Liu, Han

arXiv.org Machine Learning

We propose a new family of combinatorial inference problems for graphical models. Unlike classical statistical inference, where the main interest is point estimation or parameter testing, combinatorial inference aims at testing the global structure of the underlying graph. Examples include testing the graph's connectivity, the presence of a cycle of a certain size, or the maximum degree of the graph. To begin with, we develop a unified theory for the fundamental limits of a large family of combinatorial inference problems. We propose new concepts, including structural packing and buffer entropies, to characterize how the complexity of combinatorial graph structures impacts the corresponding minimax lower bounds. On the other hand, we propose a family of novel and practical structural testing algorithms to match the lower bounds. We provide thorough numerical results on both synthetic graphical models and brain networks to illustrate the usefulness of the proposed methods.
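For intuition about the structural quantities being tested, the toy sketch below thresholds an estimated precision matrix into a graph and checks connectivity, the presence of a cycle, and the maximum degree. The thresholding rule and the synthetic matrix are placeholders, not the paper's calibrated structural testing procedure.

import numpy as np
import networkx as nx

# Toy illustration of the combinatorial graph properties under test
# (connectivity, cycle presence, maximum degree) computed on a graph obtained
# by naively thresholding an estimated precision matrix.
rng = np.random.default_rng(1)
d = 8
A = rng.normal(size=(d, d))
precision_hat = A @ A.T + d * np.eye(d)   # stand-in for an estimated precision matrix

threshold = 2.0
adj = (np.abs(precision_hat) > threshold) & ~np.eye(d, dtype=bool)
G = nx.from_numpy_array(adj.astype(int))

print("connected:", nx.is_connected(G))
print("has a cycle:", len(nx.cycle_basis(G)) > 0)
print("max degree:", max(dict(G.degree()).values()))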